DR. DANIEL E. KODITSCHEK

GRASP Lab & Electrical and Systems Engineering Department
School of Engineering and Applied Science, University of Pennsylvania

Affordances of Animals and Machines

Abstract

The natural science of organismal biology is concerned with understanding the role of the environment in shaping the design of architectures (cognitively and energetically endowed bodies) to achieve the tasks of living and reproducing. The synthetic science of robotics is concerned with designing architectures (computationally and energetically endowed bodies) capable of performing work in an environment to achieve the tasks that a user specifies. The tasks of living often entail performing work in the environment, so robotics has derived great benefit from studying, abstracting, and realizing aspects of animal architecture in its designs. However, notwithstanding decades of attention from naturalists and cognitive psychologists, animals' ingenuity in recruiting affordances – features of their environment that facilitate their tasks – remains far less studied, and hence less exploited, by roboticists.
This talk will first review decades of my lab's collaboration with biologists on the composition of architectures to achieve desired tasks within specified environments. That work has increased our understanding of how to build robot bodies and behaviors, and it has helped biologists hypothesize how the environment shapes animal designs. I will then shift to consider the important converse problem of affordances – architecturally informed identification of environmental features that can facilitate tasks. I will present a brief account of more recent work in my lab over the past decade that attempts to develop an abstracted, compositional view of environmental features suitable for reasoning about their possible role as affordances to be exploited. The talk will conclude with speculative remarks suggesting how further collaborative work with biologists on the problem of affordances might offer new approaches to the study of animal innovation while at the same time helping to generate more innovative robot behavior in extra-terrestrial environments that no animal has yet encountered.
